Scalable subspace methods for derivative-free nonlinear least-squares optimization

Authors

Abstract

We introduce a general framework for large-scale model-based derivative-free optimization based on iterative minimization within random subspaces. We present a probabilistic worst-case complexity analysis of our method, where in particular we prove high-probability bounds on the number of iterations before a given optimality level is achieved. This framework is specialized to nonlinear least-squares problems, with a model-based approach based on the Gauss–Newton method. The method achieves scalability by constructing local linear interpolation models to approximate the Jacobian, and computes new steps at each iteration in a subspace of user-determined dimension. We then describe a practical implementation of this framework, which we call DFBGN. We outline efficient techniques for selecting points and the search subspace, yielding an implementation that has a low per-iteration linear algebra cost (linear in the problem dimension) while also achieving fast objective decrease as measured by evaluations. Extensive numerical results demonstrate that DFBGN has improved scalability and strong performance on large-scale problems.
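
The core step of this framework can be illustrated in a few lines. The Python sketch below is illustrative only, not the authors' DFBGN code: it draws a random orthonormal subspace basis, estimates the reduced Jacobian from function samples along the basis directions (a simple stand-in for the paper's interpolation models), and solves the reduced Gauss–Newton subproblem. The function name, the fixed sampling radius delta, and the absence of any trust-region safeguard are all simplifying assumptions.

    # Illustrative sketch of one derivative-free Gauss-Newton step in a
    # random subspace; not the authors' DFBGN implementation.
    import numpy as np

    def subspace_gn_step(r, x, p, delta, rng):
        """Approximate the Jacobian of the residuals r within a random
        p-dimensional subspace using only function samples, then solve
        the reduced Gauss-Newton subproblem in that subspace."""
        n = x.size
        Q, _ = np.linalg.qr(rng.standard_normal((n, p)))  # n x p basis
        rx = r(x)
        # Sampled directional differences stand in for interpolation
        # models: column i approximates J(x) @ Q[:, i].
        JQ = np.column_stack(
            [(r(x + delta * Q[:, i]) - rx) / delta for i in range(p)])
        # Reduced Gauss-Newton step: minimize ||rx + JQ s|| over s in R^p.
        s, *_ = np.linalg.lstsq(JQ, -rx, rcond=None)
        return x + Q @ s

    # Sanity check on a linear residual r(x) = A x - b: each subspace
    # step is exactly optimal within its subspace, so the residual norm
    # is non-increasing and x approaches the least-squares solution.
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
    r = lambda x: A @ x - b
    x = np.zeros(10)
    for _ in range(200):
        x = subspace_gn_step(r, x, p=3, delta=1e-6, rng=rng)
    print(np.linalg.norm(A @ x - b))  # close to the optimal residual norm

Note that each iteration touches only a p-dimensional subproblem, which is the source of the low per-iteration linear algebra cost claimed in the abstract.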


Related Articles

Sequential Penalty Derivative-Free Methods for Nonlinear Constrained Optimization

We consider the problem of minimizing a continuously differentiable function of several variables subject to smooth nonlinear constraints. We assume that the first order derivatives of the objective function and of the constraints can be neither calculated nor approximated explicitly. Hence, every minimization procedure must use only a suitable sampling of the problem functions. These problems ...
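
As a rough illustration of the sequential penalty idea in this derivative-free setting, the sketch below wraps a generic sampling-based inner solver (Nelder–Mead) in a quadratic penalty loop. The function names, penalty update factor, and iteration counts are assumptions for illustration, not the paper's algorithm.

    # Hedged sketch of a sequential quadratic-penalty loop around a
    # derivative-free inner solver; not the paper's specific method.
    import numpy as np
    from scipy.optimize import minimize

    def sequential_penalty(f, cons, x0, rho=1.0, outer_iters=8):
        """Minimize f subject to cons[i](x) <= 0 using only function
        values: each outer iteration minimizes a penalized objective
        with a sampling-based method, then increases the penalty."""
        x = np.asarray(x0, dtype=float)
        for _ in range(outer_iters):
            def penalized(x, rho=rho):
                return f(x) + rho * sum(max(0.0, c(x)) ** 2 for c in cons)
            # Nelder-Mead uses only samples of the penalized function,
            # matching the no-derivatives assumption.
            x = minimize(penalized, x, method="Nelder-Mead").x
            rho *= 10.0  # drive the iterates toward feasibility
        return x

    # Example: minimize x1 + x2 subject to x1^2 + x2^2 - 1 <= 0.
    sol = sequential_penalty(lambda x: x[0] + x[1],
                             [lambda x: x[0] ** 2 + x[1] ** 2 - 1.0],
                             x0=[0.5, 0.5])
    print(sol)  # near (-1/sqrt(2), -1/sqrt(2))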


Inner-Iteration Krylov Subspace Methods for Least Squares Problems

Stationary inner iterations in combination with Krylov subspace methods are proposed for least squares problems. The inner iterations are efficient in terms of computational work and memory, and serve as powerful preconditioners also for ill-conditioned and rank-deficient least squares problems. Theoretical justifications for using the inner iterations as preconditioners are presented. Numerical ...
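
For context, the Krylov baseline being preconditioned here is typically CGLS (conjugate gradients applied implicitly to the normal equations). The sketch below is the plain, unpreconditioned method; the paper's contribution, using stationary inner iterations as a preconditioner around these steps, is omitted, so treat this as background rather than the proposed algorithm.

    # Plain CGLS for min ||A x - b||: the unpreconditioned Krylov
    # baseline (the inner-iteration preconditioning itself is omitted).
    import numpy as np

    def cgls(A, b, iters=100, tol=1e-10):
        x = np.zeros(A.shape[1])
        r = b - A @ x              # residual in the data space
        s = A.T @ r                # residual of the normal equations
        p = s.copy()
        gamma = s @ s
        for _ in range(iters):
            q = A @ p
            alpha = gamma / (q @ q)
            x += alpha * p
            r -= alpha * q
            s = A.T @ r
            gamma_new = s @ s
            if np.sqrt(gamma_new) < tol:
                break
            p = s + (gamma_new / gamma) * p
            gamma = gamma_new
        return x

    rng = np.random.default_rng(1)
    A, b = rng.standard_normal((30, 8)), rng.standard_normal(30)
    print(np.linalg.norm(A.T @ (b - A @ cgls(A, b))))  # ~ 0 at optimum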


A Derivative-Free Algorithm for Least-Squares Minimization

We develop a framework for a class of derivative-free algorithms for the least-squares minimization problem. These algorithms are designed to take advantage of the problem structure by building polynomial interpolation models for each function in the least-squares minimization. Under suitable conditions, global convergence of the algorithm is established within a trust region framework. Promising ...
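
In the linear (degree-one) case, building such a model for every residual amounts to one shared linear solve against the sample-point displacements. The sketch below is a minimal illustration under that assumption (n affinely independent sample points, models exact on the samples); the function name and setup are ours, not the paper's.

    # Minimal sketch: linear interpolation models for each residual,
    # m_i(x0 + s) = r_i(x0) + g_i . s, exact at the sample points Y.
    import numpy as np

    def interp_model_jacobian(r, x0, Y):
        """r maps R^n -> R^m; Y is an (n, n) array whose rows are
        sample points. Returns r(x0) and the (m, n) matrix of model
        gradients, which plays the role of an approximate Jacobian."""
        r0 = r(x0)
        M = Y - x0                             # n x n displacements
        F = np.array([r(y) for y in Y]) - r0   # n x m sampled differences
        G = np.linalg.solve(M, F).T            # conditions M g_i = F[:, i]
        return r0, G

    # With coordinate samples Y = x0 + delta * e_j this reduces to
    # forward finite differences of each residual.
    r = lambda x: np.array([x[0] * x[1], x[0] ** 2 + x[1]])
    x0, delta = np.array([1.0, 2.0]), 1e-6
    Y = x0 + delta * np.eye(2)
    _, G = interp_model_jacobian(r, x0, Y)
    print(G)  # approx [[2, 1], [2, 1]], the true Jacobian at x0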


Parallel Tensor Methods for Nonlinear Equations and Nonlinear Least Squares

We describe the design and computational performance of parallel row-oriented tensor algorithms for the solution of dense systems of nonlinear equations and nonlinear least squares problems on a distributed-memory MIMD multiprocessor. Tensor methods are general purpose methods that base each iteration upon a quadratic model of the nonlinear function, rather than the standard linear model, where...
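
The distinction here is between the standard linear model M_L(x_c + d) = F(x_c) + J(x_c) d and a tensor model that appends a second-order term. Below is a minimal sketch of evaluating both, with the third-order tensor T given explicitly; in the actual tensor methods T is a low-rank object fitted to past iterates, which this omits.

    # Evaluating linear vs. tensor models of a residual vector; T here
    # is a dense stand-in for the low-rank fitted tensor of the methods.
    import numpy as np

    def linear_model(F, J, d):
        """Standard (Gauss-Newton-style) model: F + J d."""
        return F + J @ d

    def tensor_model(F, J, T, d):
        """Tensor model: F + J d + 0.5 * T[d, d], where T has shape
        (m, n, n) and supplies second-order information; with T = 0
        the two models coincide."""
        return F + J @ d + 0.5 * np.einsum("ijk,j,k->i", T, d, d)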



Journal

Journal title: Mathematical Programming

Year: 2022

ISSN: 0025-5610, 1436-4646

DOI: https://doi.org/10.1007/s10107-022-01836-1